Lecture 2 The Chernoff bound and median-of-means amplification

Abstract

In the previous lecture we introduced a streaming algorithm for estimating the frequency moment F2. The algorithm used very little space, logarithmic in the length of the stream, provided we could store a random function h : {1, . . . , n} → {±1} “for free” in memory. In general this would require n bits of memory, one for each value of the function. However, observe that our analysis used the fact that h is a random function in a rather weak way: specifically, when computing the expectation and variance of the random variable Z = c² describing the outcome of the algorithm, we only used conditions such as E[YiYjYkYℓ] = E[Yi] E[Yj] E[Yk] E[Yℓ] for distinct values i, j, k, ℓ, where Yj = h(j) is the random variable describing the output of the function at a particular point. As it turns out, this is a much weaker requirement than full independence; it is called 4-wise independence. In particular, it is possible to sample 4-wise independent functions using far fewer random bits than a uniformly random function requires. This is the idea behind derandomization: to save on the number of random coins needed while keeping the function h “random enough” that the analysis carries over.
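To make the derandomization idea concrete, here is a minimal sketch (not taken from the lecture notes themselves) of a 4-wise independent family of sign functions, built in the standard way from a uniformly random polynomial of degree at most 3 over a prime field. The class name FourWiseSigns, the choice of the Mersenne prime 2^61 − 1, and the toy stream are all illustrative.

```python
import random

# Sketch of a 4-wise independent hash family h : {0, ..., n-1} -> {+1, -1}.
# A uniformly random polynomial of degree <= 3 over Z_P takes independent,
# uniform values on any 4 distinct inputs, so mapping each value to a sign
# gives the 4-wise independence the F2 analysis needs (up to an O(1/P) bias
# from the sign mapping, ignored in this sketch). The seed is only 4
# coefficients, i.e. O(log n) random bits, rather than the n bits a fully
# random sign function would cost.

P = (1 << 61) - 1  # a Mersenne prime, larger than any reasonable universe size


class FourWiseSigns:
    def __init__(self):
        # Four random coefficients determine the whole function.
        self.coeffs = [random.randrange(P) for _ in range(4)]

    def __call__(self, x):
        # Evaluate a0 + a1*x + a2*x^2 + a3*x^3 (mod P) by Horner's rule.
        v = 0
        for a in reversed(self.coeffs):
            v = (v * x + a) % P
        # Turn the (nearly) uniform value in Z_P into a sign via its low bit.
        return 1 if v & 1 else -1


if __name__ == "__main__":
    # Toy use inside an AMS-style F2 sketch: keep one counter c, add h(j)
    # for every stream element j, and output c**2.
    h = FourWiseSigns()
    stream = [3, 7, 3, 2, 7, 7, 1]
    c = sum(h(j) for j in stream)
    print("single-sketch F2 estimate:", c * c)
```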


Similar resources

Lecture 8: September 26 8.1 Approximate Counting

Proof: Take t = O(log(1/δ)) trials of the original scheme and output the median. The reason this works is the following. In order for the median not to fall in the desired interval (1 ± ε)f(x), at least half of the t trials must fail to do so. However, each trial falls in the interval with probability at least 3/4, so this is unlikely for large t. Specifically, we have Pr[the median is outside the...
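The same amplification step is easy to phrase in code; here is a minimal sketch under the excerpt's assumptions (each run of the original scheme lands in (1 ± ε)f(x) with probability at least 3/4), with a hypothetical run_one_estimator standing in for one run of the original scheme.

```python
import random
import statistics

# Median amplification: run t independent copies of an estimator that is
# accurate with probability >= 3/4 and report the median. The median can be
# wrong only if at least t/2 copies are wrong at once, which by a Chernoff
# bound happens with probability exp(-Omega(t)); t = O(log(1/delta)) makes
# this at most delta.

def median_amplify(run_one_estimator, t):
    return statistics.median(run_one_estimator() for _ in range(t))


if __name__ == "__main__":
    # Toy estimator: returns the true value 100 with probability 0.75 and
    # an arbitrary value in [0, 200] otherwise.
    noisy = lambda: 100.0 if random.random() < 0.75 else random.uniform(0, 200)
    print(median_amplify(noisy, t=25))
```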


Lecture 15: Chernoff bounds and Sequential detection

1 Chernoff Bounds. 1.1 Bayesian Hypothesis Test. A test using the log-likelihood ratio statistic has the form T(Y) = log L(Y) ≷ τ. (1) Bound 1: the probability of error Pe is bounded as Pe ≤ (π0 + π1 e^τ) · e^{μ_{T,0}(s0) − s0τ}, (2) where μ_{T,0}(s) = log E0[e^{sT(Y)}] and μ′_{T,0}(s0) = τ. Bound 2: for all s ∈ [0, 1], Pe ≤ max(π0, π1 e^τ) · e^{μ_{T,0}(s) − sτ}. (3) Derivation of the above bound: consider Pe = π0 P0(Γ1) + π1 P1(Γ0) = ...
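For orientation, the first Chernoff step behind bounds (2) and (3) is the standard exponential Markov argument; a sketch, assuming the usual convention Γ1 = {T(Y) ≥ τ} and any s ≥ 0 (neither of which is spelled out in the excerpt):

\[
P_0(\Gamma_1) \;=\; P_0\!\left(e^{sT(Y)} \ge e^{s\tau}\right)
\;\le\; e^{-s\tau}\, E_0\!\left[e^{sT(Y)}\right]
\;=\; e^{\mu_{T,0}(s) - s\tau}.
\]

The companion term π1 P1(Γ0) is handled by the change of measure dP1 = e^{T(Y)} dP0, which is where the extra e^τ factor in (2) and (3) comes from.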


Lecture 2: Matrix Chernoff bounds

The purpose of my second and third lectures is to discuss spectral sparsifiers, which are the second key ingredient in most of the fast Laplacian solvers. In this lecture we will discuss concentration bounds for sums of random matrices, which are an important technical tool underlying the simplest sparsifier construction.


COS 521 Lecture 4: Streaming Algorithms

The Chernoff bound is a tail bound for sums of independent real-valued random variables.
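One common form of the bound, stated here for reference rather than drawn from the linked notes: if X1, . . . , Xn are independent and take values in [0, 1], X = X1 + · · · + Xn, and μ = E[X], then for 0 < δ ≤ 1,

\[
\Pr[X \ge (1+\delta)\mu] \le e^{-\delta^2 \mu / 3},
\qquad
\Pr[X \le (1-\delta)\mu] \le e^{-\delta^2 \mu / 2}.
\]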


CS 229r: Algorithms for Big Data, Fall 2013, Lecture 4 — September 12, 2013

2 Algorithm for Fp, p > 2
2.1 Alternate formulation of Chernoff bound
2.2 Returning to proof of Theorem 1
2.3 Digression on Perfect Hashing
2.4 Finishing proof of Theorem 1
...




Publication date: 2017